

This manual provides guidelines for administering and managing the extension from a technical perspective. It is meant to be used by technical users and system administrators who will be responsible for the Tableau extension.

Write-Back Overview

How Write-Back is deployed

  • Write-Back Server
    • Tableau extensions are web based and run on an application server separate from the Tableau Server. This means that Write-Back should be deployed side by side with your Tableau Server, ideally on a separate machine. Users will interact with the Write-Back server either in Tableau Desktop or on Tableau Server; the behavior is very similar, because even in Tableau Desktop extensions run in an embedded browser.
    • The Write-Back server location is defined on the Tableau dashboard by adding a trex file. This is the file that tells Tableau where the Write-Back server is; since Write-Back runs in your environment, you should always select My Extensions and locate the file that is provided after finishing the installation process. Bear in mind that all Creators who will build dashboards with Write-Back need to have the corresponding trex file at hand.
  • Write-Back dataset Tables
    • Write-Back acts as a SQL engine, storing the data submitted by users in tables. Whenever a Creator makes a new configuration in Write-Back, a table is provisioned in the Write-Back database. Since a table is created for each dataset, it is advisable to configure Write-Back to use a separate schema in your database. This keeps the Write-Back environment more contained, avoiding issues, and you always know for sure that every table has the same origin. To achieve this, a database user with full permissions on the schema is configured in the Write-Back back-end. You should also have a read-only database user that Tableau Creators can use to build dashboards with data sources pointing at Write-Back data. The Write-Back table structure only changes if the configuration is changed, so you can leverage this fact by following our Back-end Manual Data Procedures and applying ETL, stored procedures, or any other mechanism on top of the Write-Back tables.
    • By writing to a separate dataset, and with the audit mechanism, we are sure that even though Creators and Explorers have the freedom to be creative and use Write-Back in different ways, the original data is never touched and we can always keep track of who did what. This detailed audit trace can be important for several reasons:

      • Over time users might introduce inaccurate inputs, and you want to know what the original submission was

      • Any mistakes that might arise can always be fixed; no data is overwritten

      • To ensure compliance, you know who did what 

    • That is why a separate dataset is the best choice: since the extension fully manages it, Write-Back can ensure all the premises above regardless of user actions
  • Data Warehouse
    • Your existing data will reside in a separate schema or even a separate database, so you will need to blend it with Write-Back data by leveraging different mechanisms:
      • Tableau
        • Data blending can be used to associate Write-Back dataset entries with existing data; the fields chosen from the source sheet can be used as linking fields. For more information check the Tableau documentation. While easy to use, this can have an impact on performance, as the joins are done in memory in Tableau.
        • Relationships and data source joins allow combining data from different tables that might even be in different databases, and thus connect the dataset with existing data. For more information check the Tableau documentation. If you have all data in the same database this option is easy to apply and performance should not be affected, as Tableau will push the joins down to the database.
      • Database
        • Create database views that join the Write-Back tables with your existing data, so the combined result can be consumed as a single data source.
  • Back-end data processes
    • While Write-Back does not provide any back-end processes out of the box, technical users can implement them. These processes can be responsible for automatic tasks such as:
      • Pre-populating datasets, for instance with a basis for forecasting
      • Cleaning old data or data that is no longer relevant
      • Synchronizing with other databases, for instance an analytical database
    • These can be implemented in different ways using:
      • Database
        • Using database triggers that, based on the Write-Back dataset, update or insert data in other tables
        • Using stored procedures
      • Extract Transform and Load (ETL)
        • Your organization might already use an ETL tool for other purposes; it can be re-used here to update or insert data in other tables.
    • Please be aware that these mechanisms can overwrite existing data that might not be recoverable without restoring a database backup; use them wisely, implementing any validations required to ensure no data is lost inadvertently. There will be a delay in making the data available, but these techniques can improve performance when the data is read on dashboards. For more technical details please check the section Back-end Data Processes.

  • Security
    • Since Write-Back runs on a separate server, we need to ensure that security is also applied there. Write-Back supports multiple authentication mechanisms, including single sign-on technologies, allowing you to match the security of the Tableau Server, simplify maintenance, and keep users synchronized.
    • For authorization you should rely on Tableau Server security, ensuring that dashboards with Write-Back configured are only accessible to users who are entitled to submit information.

How Write-Back works

  1. Creator and Explorer users open Tableau Desktop and add the extension component to a dashboard that will be used to input data. This dashboard contains not only the extension but also a sheet that is used as a source to drive the data to which the user can add information. After adding the extension component a trex file should be chosen; this is the file that tells Tableau where the Write-Back server is. Since Write-Back runs in your environment, you should always select My Extensions and make sure that all Creators have the corresponding trex file at hand.
  2. The user configures the extension by specifying the additional fields and types, as well as the table where the dataset is going to be persisted. When they click submit, this information is sent to the extension back-end.
  3. The extension back-end is responsible for managing the configurations as well as creating the new dataset. This means dynamically creating the tables that will support the dataset.
  4. The power user publishes the dashboard to the Tableau Server, ensuring that only the right users have permission to see the dashboard and consequently input data.
  5. Tableau Server users can now select existing data and append additional data through the extension, for example comments or forecast deltas.
  6. The back-end is responsible for inserting the data submitted by the user into the configured dataset.
  7. The data stored can then be used as any regular data source configured on Tableau Desktop, either live or as an extract.
  8. Immediately after steps 3 and 6, the corresponding actions are audited; all actions done by users are recorded. Audit information is kept in a separate table.

Technical Details

Authentication Workflow with SAML or OpenID

With SAML or OpenID enabled as authentication options, the application experience is a little different, since a new popup will open where the requests to the IdP happen. As such, when the authentication process is triggered, a new Tableau pop-up will come up; this corresponds to the request for the IdP to authenticate the user. Below are the screens that will show up.

This one means a new popup will open in order to request authentication.

The screens below show in the new popup after the requests to the IdP happen.

Here you will receive a message regarding the authentication success. If something failed, you will be redirected to this error screen.

If you close the window manually and do not let the login process complete, you will be redirected to this page.

Back-end Tables

The extension internally uses a set of pre-defined tables to store its configurations and audit data. These tables should not be manipulated externally.

Historical Audit

The historical_audit table is automatically populated by the extension, keeping track of all actions taken by users. Each entry has the following fields:

  • ID - Primary key (UUID). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45
  • ACTIONDATE_LOCAL - Date and time of the user's action, stored in the server time zone. Example: 2019-01-21 14:24:47
  • ACTIONDATE_UTC - Date and time of the user's action, stored in UTC. Example: 2019-01-21 12:24:47
  • ACTIONNAME - The type of the user's action; please find below a list of the available actions. Example: ALTER_DATASET
  • ACTIONPARAMS - Details of the user's action. Example: renameColumn;dropColumn;modifyColumn;addColumn
  • DATASET - Name of the data set. Example: forecasts_modifications
  • DATASETKEY - Record's key in the dataset (UUID). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45
  • SYSTEM - The system where the user performs actions; possible values include 'Server' or 'Desktop'. Example: Server
  • USERNAME - The user that performed the action. Example: username
  • IDWIDGETCONFIGURATION - The ID of the widget configuration. Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45

The action names considered are the following:

  • A Tableau Desktop user configures a new extension
  • A Tableau Desktop user deletes a data set
  • ALTER_DATASET - A Tableau Desktop user edits the extension configuration and this has an impact on the layout of the data set (adding or removing fields, as well as changing the name)
  • DELETE_DATA - The user, either on Desktop or Server, deletes a data entry
  • INSERT_DATA - The user, either on Desktop or Server, submits a data entry
  • EDIT_DATA - The user, either on Desktop or Server, edits an existing data entry

Widget Configuration

The widget_config table stores metadata of the dataset.

  • ID - Primary key (UUID). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45
  • TIMESTAMP_LOCAL - Date and time when the table was created, stored in the server time zone. Example: 2019-01-21 14:24:47
  • TIMESTAMP_UTC - Date and time when the table was created, stored in UTC. Example: 2019-01-21 12:24:47
  • USERNAME - The user that created the dataset. Example: username
  • DISPLAYNAME - User-defined name of the dataset. Example: Forecasts Modifications
  • Name of the dataset's table, taken from the DISPLAYNAME. Example: forecasts_modifications
  • WORKSHEET - Name of the worksheet used to configure the extension. Example: 2019 Profit
  • ISACTIVE - Boolean that tells whether the dataset has been deleted (false) or not (true). Example: 1/true

Widget Field Configuration

The widget_field_config table stores metadata of the dataset's fields.

  • ID - Primary key (UUID). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45
  • FIELDNAME - Technical name of the field, based on the user input. Example: sales_forecast
  • DISPLAYFIELDNAME - User input of the field name. Example: Sales Forecast
  • ISKEY - A key field means that it came from the worksheet; a field that is not a key was user generated. Example: 1/true
  • WIDGETCONFIGURATION_ID - ID of the dataset that the field belongs to (foreign key). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45

Widget Field Values

The widget_field_values table stores the predefined values of the dataset's fields.

  • ID - Primary key (UUID). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45
  • FIELDVALUE - Value of one of the options of the field. Example: Ok/Cancel
  • WIDGETFIELDCONFIGURATION_ID - ID of the field the predefined value belongs to (foreign key to the ID in the widget_field_config table). Example: 1234a678-9b01-23c4-5d6e-7f89a0123b45

Data-set Repository

When a dataset is created, a table is also created in the database to store its values. Take the "Sales Forecast" dataset, for example: creating it will create a new table named "sales_forecast" with the fields defined in the configuration.

The creation of a dataset also adds a new line to the historical audit saying that a dataset was created, adds a line to the widget configuration, and finally adds all the defined fields to the widget fields configuration.

Hyper Files

With the support for Hyper files as a destination for data, we made the decision to separate where the configuration tables and the dataset tables are stored. When using Hyper as the persistence type for the extension, be aware that you still need a SQL database (one of the supported ones); that is where the extension will write the metadata for the data you insert into the Hyper file.

Environment Integration

Check the Manual Installation page for pre-requisites and configurations.

On-going Maintenance

Changing License Key

When you get a new license key you must add it at the end of the config.yml file, located in the {tomcat dir}\webapps\twbe\WEB-INF\classes folder, and restart the web application or Tomcat.

       licence-key: {key generated}


Monitoring can be performed by analyzing the Historical Audit table.
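As an illustration, such monitoring amounts to simple SQL aggregations over historical_audit. The sketch below uses an in-memory SQLite table with a simplified subset of the columns documented above; the real database, types, and data will of course differ:

```python
import sqlite3

# Simplified stand-in for the historical_audit table (real schema has more columns).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE historical_audit (
        ID TEXT PRIMARY KEY,
        ACTIONDATE_UTC TEXT,
        ACTIONNAME TEXT,
        DATASET TEXT,
        USERNAME TEXT
    )
""")
conn.executemany(
    "INSERT INTO historical_audit VALUES (?, ?, ?, ?, ?)",
    [
        ("id-1", "2019-01-21 12:24:47", "INSERT_DATA", "forecasts_modifications", "alice"),
        ("id-2", "2019-01-21 12:30:00", "EDIT_DATA", "forecasts_modifications", "bob"),
        ("id-3", "2019-01-22 09:00:00", "INSERT_DATA", "forecasts_modifications", "alice"),
    ],
)

# A typical monitoring query: count actions per user and action type.
rows = conn.execute("""
    SELECT USERNAME, ACTIONNAME, COUNT(*) AS n
    FROM historical_audit
    GROUP BY USERNAME, ACTIONNAME
    ORDER BY USERNAME, ACTIONNAME
""").fetchall()
print(rows)
```

The same grouping query can be run against the real audit table with any database client or a Tableau data source.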


The configuration file can be found in the folder of the extension on the server <extension_folder>/WEB-INF/classes/config.yml

Every time the configuration is changed, the server should be restarted. Another way of changing the configuration, as detailed in the installation manual, is to extract the war (with 7zip, for example), edit the config.yml there, and then compress it again as a .war. This latter procedure should be done outside of the webapps folder.

More configuration details can be found in the installation manual.

Start / Stop

The extension can be started and stopped by simply moving the .war file somewhere other than the webapps folder, waiting for Tomcat to remove the folder, and then placing the .war file back in again. Be aware that the config file might have changed, as Tomcat will extract it again from inside the war.

Another way is to just simply restart the Tomcat server.


The maintenance performed by the administrator is mainly focused on the audit table; however, if there are any unexpected issues, the extension has a logging system that will output errors. These logs can usually be found in <tomcat_installation>/temp/spring.log or in /logs/logger.log.

Error Handling

Existing errors considered

To handle errors in the back-end, error codes have been created for the standard operations of the endpoints. They can be found in the following table and are sent in the HTTP response body. Please always refer to the logs for more detail about what happened.

The logs are found in the logs folder of the Tomcat directory (the logger file), and the Spring-related logs (also important) are found in the temp folder of the Tomcat directory. When an error code is returned and there is no representation for it, the error message will clarify that it is an undefined error.

Message: Generic error. Contact support.
Description: When a general error happens. This is used if the category of the error does not fit any of the other categories. Check the logs for more info and contact support if needed.

Message: Database appears to be down. Contact administrator.
Description: This error happens if the back-end could not establish a connection to a database. Check if the database is available.

Message: Not possible to create configuration with the provided input.
Description: When creating the database table fails for some unexpected reason. Check the logs for help. (CREATE TABLE)

Message: There was an unexpected error while deleting the dataset.
Description: When deleting the database table fails. (DROP TABLE)

Message: Not possible to alter configuration with the provided input.
Description: When altering the database table fails for some unexpected reason. Check the logs for more detail. SQL actions: ALTER TABLE | MODIFY COLUMN | RENAME TO ... CHANGE COLUMN

Message: The configuration you attempted to create already exists.
Description: When attempting to create a database table that already exists, or to rename an existing table to a name that already exists. This usually should not happen, as there is a condition on the front end that blocks it.

Message: The configuration has duplicated pill names.
Description: When inserting multiple foreign keys with the same name. (PersistenceEntryException)

Message: The configuration has fields with the same name.
Description: When attempting to submit or edit a configuration with two identical field names. This usually should not happen, as there is a condition on the front end that blocks it.

Message: Couldn't delete the configuration because it was not found.
Description: When trying to delete a database table that does not exist.

Message: We couldn't find the query on the queries file. Contact support.
Description: Errors related to statement creation for the dataset. This usually happens when the query couldn't be found in the query file. If this error occurs while creating or altering a widget, a manual rollback might be necessary, so check the logs for the fatal warning.

Message: The action was not audited. Manual rollback might be required. Contact administrator.
Description: Errors when inserting into the audit. (Should trigger a manual rollback on the dataset when DDL rollback is not supported.)

Message: Configuration was not created due to an unexpected error. Contact administrator.
Description: Errors when creating the widget. This error can be quite damaging, since it means only the configuration was not created, which might mean that the dataset was. Check the logs for a fatal error; you might need a manual rollback. (Should trigger a manual rollback on the dataset when DDL rollback is not supported.)

Message: An unexpected error occurred and the widget configuration was not deleted.
Description: Errors when deleting the widget.

Message: The configuration was not altered due to an unexpected error. Contact administrator.
Description: Errors when altering the widget. (Should trigger a manual rollback on the dataset when DDL rollback is not supported.) Make sure to check the logs.

Message: There was an unexpected error while inserting data. Contact support.
Description: Errors when inserting data into the dataset, usually because the query is not built properly with the input fields. Not expected to happen.

Message: There was an unexpected error while editing data. Contact support.
Description: Errors related to the edit data operation, usually an SQL error because the query is not built properly. Not expected to happen.

Message: There was an unexpected error while deleting data. Contact support.
Description: Errors related to the delete data operation, usually an SQL error because the query is not built properly. Not expected to happen.

Message: Configuration not found in the records. Maybe it was deleted manually.
Description: This error happens when the back-end returns a 404 for getWidgetConfiguration, because the configuration was not found in the database.

Finally, a success response is returned when everything goes as expected.

Manual rollbacks

If the server fails to write the configuration or audit, the widget configuration and the audit record need to be manually deleted to maintain consistency, or the creation of the table needs to be rolled back.

These actions are only to be performed when something unexpected fails.

Operation: CreateWidget
Scenario: The configuration creates a table but isn't able to insert data or edit the dataset
Rollback instructions: Look in the log file or in the audit table to find the name of the table. Beware that it is necessary to remove the table from widget_config and historical_audit, as well as the fields from widget_field_config and widget_field_values. To delete from these last two, simply use the WIDGETCONFIGURATION_ID and WIDGETFIELDCONFIGURATION_ID (if there are predefined values); you can find this ID in the widget_config table. Run the query "DROP TABLE X", where X is the name of the dataset associated with the exception.

Operation: DeleteWidget
Scenario: The action to delete the dataset fails
Rollback instructions: Manually drop the table, similar to the previous instructions.

Operation: AlterWidget
Scenario: Performing an edit on the dataset throws an error
Rollback instructions:

If a rename of the fields needs to be performed simply go to the desired table and manually rename the columns and then in the widget_field_config also rename the fields.

If a change of type is needed, the column must be dropped and added again with the new type, and then the type in widget_field_config should also be changed to the new one (TEXT if varchar, NUMERIC if double, and PREDEFINED if varchar). In the case of PREDEFINED, it is also necessary to add the predefined values to widget_field_values, with WIDGETFIELDCONFIGURATION_ID as the foreign key to the predefined field in widget_field_config.
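The CreateWidget cleanup above can be scripted to reduce mistakes. A minimal sketch that only builds the SQL statements, using the table and column names from this manual; the example identifiers are illustrative and values must be adapted to your environment before running anything:

```python
def rollback_statements(dataset_table: str, widget_config_id: str) -> list[str]:
    """Build cleanup SQL for a failed CreateWidget, following the rollback instructions."""
    wid = widget_config_id
    return [
        # Predefined values reference widget_field_config, so remove them first.
        "DELETE FROM widget_field_values WHERE WIDGETFIELDCONFIGURATION_ID IN "
        f"(SELECT ID FROM widget_field_config WHERE WIDGETCONFIGURATION_ID = '{wid}');",
        f"DELETE FROM widget_field_config WHERE WIDGETCONFIGURATION_ID = '{wid}';",
        f"DELETE FROM historical_audit WHERE IDWIDGETCONFIGURATION = '{wid}';",
        f"DELETE FROM widget_config WHERE ID = '{wid}';",
        # Finally drop the dataset table itself.
        f"DROP TABLE {dataset_table};",
    ]

for stmt in rollback_statements("sales_forecast", "1234a678-9b01-23c4-5d6e-7f89a0123b45"):
    print(stmt)
```

Review the generated statements against the log file and the widget_config table before executing them.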

Back-end Data Processes

It is possible to manually add data to a dataset. You can do a manual insert directly in the database, but bear in mind that those actions won't be stored in the audit table (at the end of the recommended procedure we give some best practices for adding data manually).

The recommended way of doing this is by calling appendData on the extension API. Here is an example (we recommend using Postman or a similar tool):

  1. First, simulate a login:
    1. This is required because, in order to call the API, an authorization token needs to be provided. To get a token, a login needs to be performed.
    2. The login can be made like so:
      1. API endpoint:  https://extension-url:port/twbe/api/user/login
      2. This endpoint receives a form-data with username and password
      3. Add the following headers; you need to specify which site you are authenticating to
      4. After that, send the request and a token should be received. Save this token.

  2. Then do the request:
    1. API endpoint:  https://extension-url:port/twbe/api/dataset/appendData
    2. This endpoint receives a JSON object
      1. json object appendData
        {
            "widgetId": "1234a567-8b91-23c4-5d6e-7f89a0123b45",
            "records": [
                [
                    {"field": "Sales", "value": "123456789"},
                    {"field": "Country", "value": "Portugal"},
                    {"field": "Region", "value": "Europe"},
                    {"field": "field 1", "value": "123"}
                ],
                [
                    {"field": "Sales", "value": "123456788"},
                    {"field": "Country", "value": "Portugal"},
                    {"field": "Region", "value": "Europe"},
                    {"field": "field 1", "value": "123"}
                ]
            ],
            "system": "direct_call"
        }
        1. widgetId - ID of the dataset (can be found in the widget_config table).
        2. system - usually says "server" or "desktop" but as a good practice, here it is recommended to set "direct_call" to know that it was a direct call to the API.
        3. records - set of records to be inserted.
          1. This will vary according to each use case. In this example, the first three fields represent the keys or pills in the add data screen, and the last one (field 1) is the user-input field. For appendData to work, it is mandatory that the configured keys are sent in the request. A good way of making sure it goes correctly is to go to the extension, add an entry, see the result of the request, and then copy the JSON that was sent. Example (press F12 on the add data pop-up before adding data to see the request; only on the browser, i.e., using Tableau Server):
          2. Example in postman:
    3. Besides the JSON object, it is also necessary to provide the Authorization header as well as the Content-Type and ExtensionSite headers
      1. The Authorization header (that was obtained from the login) is set in the Authorization tab:
      2. The headers tab. You need to specify the extension site you desire to append data to and the content-type:
    4. After this, just send the request and the application will do everything else. It is possible to see the results in the audit table and in the dataset table.
    5. If any error happens when sending the request, here are a few common issues:
      1. Make sure that the keys fields are all being correctly sent.
      2. Make sure that at least one of the user input fields is being sent.
      3. Make sure the authorization header and the content-type are in the request.
      4. Make sure that the body is in the proper format, i.e., JSON.
    6. If there are other issues, check the log files (usually logs/logger.log) to see if there is any issue with the query or temp/spring.log to check if the back end is throwing any error.

Bear in mind that since this is done manually, it is more susceptible to failure, because it does not have the control and error handling present when using the extension's front end.

The recommended method is calling the API: this way the logic is handled by the application, the request goes through some checks, IDs are generated for the records, and the actions are registered in the historical_audit table.
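The walkthrough above can be scripted to avoid hand-editing JSON. Below is a sketch that builds the appendData body; the endpoint and header names are taken from this manual, while the actual HTTP calls are left commented out because the URL, site, and credentials are environment-specific:

```python
import json

def build_append_payload(widget_id: str, marks: list[list[tuple[str, str]]],
                         system: str = "direct_call") -> dict:
    """Build the appendData body: one sub-list of field/value pairs per mark."""
    return {
        "widgetId": widget_id,
        "records": [
            [{"field": f, "value": v} for f, v in mark]
            for mark in marks
        ],
        "system": system,
    }

payload = build_append_payload(
    "1234a567-8b91-23c4-5d6e-7f89a0123b45",
    [
        [("Sales", "123456789"), ("Country", "Portugal"),
         ("Region", "Europe"), ("field 1", "123")],
    ],
)
body = json.dumps(payload)

# Sending it would look roughly like this (not executed here; token and site
# come from the login step described above):
# requests.post("https://extension-url:port/twbe/api/dataset/appendData",
#               data=body,
#               headers={"Authorization": token,
#                        "Content-Type": "application/json",
#                        "ExtensionSite": site})
```

The key fields for your dataset must be present in every mark, exactly as described in the steps above.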

Adding data to a tabular configuration

When adding data to a tabular configuration, the endpoint is exactly the same and the JSON structure is also the same, but we feel it is necessary to show how the JSON is built, as the tabular submit screen is quite different from the form.

Here's an example of a tabular add data:

This input will result in the following JSON:

Tabular AppendData
{
    "widgetId": "3722bbfc-7599-44af-965f-dc46eb8dc8c2",
    "system": "direct_call",
    "records": [
        [
            {"field": "Field1", "value": "5000"},
            {"field": "MONTH(Order Date)", "value": 1},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Spain"}
        ],
        [
            {"field": "Field2", "value": "300"},
            {"field": "MONTH(Order Date)", "value": 1},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Spain"}
        ],
        [
            {"field": "Field2", "value": "3600"},
            {"field": "MONTH(Order Date)", "value": 2},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Spain"}
        ],
        [
            {"field": "Field3", "value": "6000"},
            {"field": "MONTH(Order Date)", "value": 3},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Spain"}
        ],
        [
            {"field": "Field1", "value": "4500"},
            {"field": "MONTH(Order Date)", "value": 4},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Spain"}
        ],
        [
            {"field": "Field1", "value": "5000"},
            {"field": "MONTH(Order Date)", "value": 1},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Portugal"}
        ],
        [
            {"field": "Field2", "value": "300"},
            {"field": "MONTH(Order Date)", "value": 1},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Portugal"}
        ],
        [
            {"field": "Field2", "value": "3600"},
            {"field": "MONTH(Order Date)", "value": 2},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Portugal"}
        ],
        [
            {"field": "Field3", "value": "6000"},
            {"field": "MONTH(Order Date)", "value": 3},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Portugal"}
        ],
        [
            {"field": "Field1", "value": "4500"},
            {"field": "MONTH(Order Date)", "value": 4},
            {"field": "Comment", "value": "Rest API"},
            {"field": "Country", "value": "Portugal"}
        ]
    ]
}

As you can see, it is very similar, although it creates a bit more data. This is also because in this scenario we are adding to two marks at the same time.

The records structure is as follows:

Records Structure
records: [
  [ // mark 1
    {"field": "fieldName", "value": "fieldValue"},
    { ... }
  ],
  [ // mark 2
    {"field": "fieldName", "value": "fieldValue"},
    { ... }
  ]
]

Below, you can find a Postman collection with the endpoints necessary to use the extension through REST requests.

Write-Back Rest API Postman Collection 

Here you will find the requests already structured and ready to be filled with your data, including the login request where you can get the authorization token.

If you wish to make these changes manually, that is also possible. We strongly recommend using the previous method (calling the API), but we will also describe a possible way to insert directly into the database.

  1. First, the insert is made on the dataset table.
    1. Make sure that the key fields are correctly inserted with the proper values; otherwise, when reading the data, the extension won't know how to display them, which may cause issues.
    2. Example query:

      INSERT INTO "dataset_table"
      ("surrogateKey", "key1", "key2", "key3", "field_1", "field_2", isActive, timestamp_local, timestamp_utc)
      VALUES ('manual-insert-2019-05-23-13-15-00-i', '12345', 'abcdef', '12345', '12345', 'abcdef', TRUE, '2019-05-23 13:15:00', '2019-05-23 13:15:00');

      A few notes to consider:

      1. manual-insert-2019-05-23-13-15-00-i

        This is the format we recommend for this type of insert, to ensure a unique key and make these inserts easy to spot. The format is manual-insert-TIMESTAMP-i, where TIMESTAMP is the current date and time and "i" is an incremental number. This "i" is particularly useful for batch inserts, for example.

      2. "key1", "key2", "key3", "field_1", "field_2" - these fields depend on the table columns and the keys configured

      It is also necessary to manually insert the action into the audit table to keep a record of what was done:

      INSERT INTO historical_audit 
      VALUES ('manual-audit-2019-05-23-13-15-00-i', '2019-05-23 14:15:00', '2019-05-23 13:15:00', 'INSERT_DATA', 'dataset_table', 'manual-insert-2019-05-23-13-15-00-i', 'dataset_id', 'manual_insert', 'username');

      A few notes to consider

      1. The ID format is similar to the insert on the dataset table just changing from "insert" to "audit".
      2. DATASET is the name of the table

      3. IDWIDGETCONFIGURATION is the ID of the dataset table and can be found in the widget_config table

      4. DATASETKEY is the surrogateKey of the record

    3. Diagram of the relation between tables/keys:
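The two inserts above must stay consistent: the audit row's DATASETKEY must equal the dataset row's surrogateKey, and the two IDs share the same timestamp and increment. A small helper makes that easier to get right; this is a sketch following the key format from the notes above:

```python
from datetime import datetime

def manual_keys(i: int, now: datetime) -> tuple[str, str]:
    """Return matching surrogate and audit keys in the manual-insert-TIMESTAMP-i format."""
    stamp = now.strftime("%Y-%m-%d-%H-%M-%S")
    return (f"manual-insert-{stamp}-{i}", f"manual-audit-{stamp}-{i}")

# Generate the pair of keys for one manual insert.
surrogate, audit = manual_keys(1, datetime(2019, 5, 23, 13, 15, 0))
print(surrogate)  # manual-insert-2019-05-23-13-15-00-1
print(audit)      # manual-audit-2019-05-23-13-15-00-1
```

For batch inserts, increment i for each row while keeping the same timestamp.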

Querying the Write-Back Tables

If you are querying the database directly, either with a database management tool or from Tableau with a custom SQL connection, pay attention to how the columns are referenced, as some of them usually need to be wrapped in double quotes.

Example with historical audit:

SELECT "ID", "ACTIONDATE_UTC", "ACTIONNAME", "DATASET", "DATASETKEY", "USERNAME" FROM historical_audit;


Example with a dataset:

SELECT "surrogateKey", categorization, future_action, analysis_date, country, order_date_month__year, "ISACTIVE", "TIMESTAMP_LOCAL", "TIMESTAMP_UTC" FROM outlier_analysis;

As you can see, the technical fields from Write-Back contain uppercase letters, which means they must be referenced with double quotes.
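This quoting rule can be automated when building custom SQL. A sketch using the heuristic that identifiers containing uppercase letters or spaces need quoting (this matches PostgreSQL-style lowercase folding; adjust for your database):

```python
def quote_if_needed(column: str) -> str:
    """Double-quote identifiers containing uppercase letters or spaces."""
    if column != column.lower() or " " in column:
        return f'"{column}"'
    return column

# Build a SELECT over a dataset table, quoting only the columns that need it.
cols = ["surrogateKey", "categorization", "ISACTIVE", "TIMESTAMP_UTC"]
select = "SELECT " + ", ".join(quote_if_needed(c) for c in cols) + " FROM outlier_analysis;"
print(select)
```

The same helper can be applied to the historical_audit columns, which are all uppercase and therefore all quoted.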
