A walkthrough of using the Migration API to promote a Dashboard from lower (Dev) to higher (Prod) environments

The Migration API

The Migration API is a RESTful interface that lets us programmatically export and import Dashboards, along with other types of Arcadia artifacts, from one Arcadia environment to another.

Prerequisites to using the Migration API

  1. Enable API Keys in Arcadia
  2. Create an API Key for your user running the migration scripts
  3. Create a “Role” with the right permissions for importing Arcadia artifacts
  4. Assign your user running the migration scripts the new Import Role

Enable API Keys in Arcadia

To enable API Keys in Arcadia, add the DATA_API_ENABLED=True setting to the Arcadia Visualization Server Safety Valve (settings_cm.py) in CDH, or to the Arcviz Settings if you’re using Ambari/Hortonworks. Below is an example of how to set this in Cloudera Manager:

[Screenshot: DATA_API_ENABLED=True set in the Cloudera Manager Safety Valve]
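For reference, the safety-valve entry itself is just one line of Python appended to settings_cm.py (a minimal sketch; the exact field name in your Cloudera Manager version may differ slightly):

# Added to the Arcadia Visualization Server Safety Valve (settings_cm.py)
DATA_API_ENABLED=True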

Create an API Key for your user running the migration scripts

Once API Keys have been enabled, you will need to add an API Key for the user that will run the Export/Import migration scripts. Copy the Secret Key and save it somewhere safe, since this is the API Key that will be used inside of your migration scripts.

For more information on creating and managing API keys, see our documentation.
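Once you have the Secret Key, every Migration API call passes it in an apikey authorization header. The export and import scripts below build that header like this (a minimal sketch):

import requests

arcadia_apikey_secret = '<your api key secret>'  # the Secret Key saved when the API Key was created
headers = {'AUTHORIZATION': 'apikey ' + arcadia_apikey_secret}
# Every request to /arc/migration/api/... then includes headers=headers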

Create a “Role” with the right permissions for importing Arcadia artifacts

By default, any Arcadia user can “Export” a Dashboard they have access to. However, importing Dashboards into Arcadia requires additional permissions, because the exported file containing the Dashboard metadata also includes metadata for the Visuals within the Dashboard, as well as the Dataset, Custom Colors, Custom Dates, and any Custom Styles attached to the Dashboard and/or Visuals. Importing is also done on a per-Connection basis, so the user will need access to any Arcadia Connection that will be imported into. The screenshot below shows an example Role with the permissions needed to import properly into our “Production Arcengine” connection in the target environment (i.e. Prod). This Role would be created within the target environment (i.e. Prod).

NOTE: The “Manage custom styles” permission gives the import user the ability to import Custom Colors, Dates, and Styles attached to the Dashboard and/or Visuals within the Dashboard.

Assign your user running the migration scripts the new Import Role

Users can be assigned to Roles explicitly or through Group membership. Depending on how your users are managed in Arcadia (local accounts, LDAP, SAML, etc.), the way you associate a user with a Role may differ. Below is how our example user was associated with the new Import Role.

[Screenshot: the example user assigned to the new Import Role]

The Export and Import Scripts

Below are examples of both Export and Import scripts using our Migration API. For more information on the Migration API, visit our documentation.

NOTE: You should replace the placeholder values in the scripts, such as <your hostname> and <your api key secret>, with values that correspond to your environment.

Export Script example:

import requests
import json
import time

current_unixtimestamp_utc = int(time.time())

### SETUP #####
## Provide Arcadia Configuration

arcadia_url = 'http://<your hostname>:38888'
arcadia_apikey_secret = '<your api key secret>'
dashboard_id_list = [<your dashboard id>, <your dashboard id>,...]
# Append dashboard filename with current unix timestamp
dashboard_export_filename = '<your dashboard name>' + '_' + str(current_unixtimestamp_utc) + '.json'

### EXPORT DASHBOARDS ###
## Setup Export Request

headers= {'AUTHORIZATION':'apikey ' + arcadia_apikey_secret}
payload = {'dashboards': json.dumps(dashboard_id_list), 'filename': dashboard_export_filename, 'dry_run': 'False'}
r = requests.get(arcadia_url + '/arc/migration/api/export/', params=payload, headers=headers)

## Check Response

if int(r.status_code) == 200:
	print("[+] INFO - EXPORT SUCCESS")
	print("[+] INFO - Response Status: %s" % r.status_code)
	# Write JSON file for export on request success
	with open(dashboard_export_filename, 'w') as f:
		f.write(r.text)
else:
	print("[!] ERROR - EXPORT FAILURE")
	print("[!] ERROR - Response Status: %s" % r.status_code)
	print("--------------------------")
	print(r.text)

Import Script example:

import requests

### SETUP #####
## Provide Arcadia Configuration

arcadia_url = 'http://<your hostname>:38888'
arcadia_apikey_secret = '<your api key secret>'
arcadia_dataconnection_name = '<your Arcadia connection name>'
json_file_path = '<your path to the exported JSON file>/<dashboard JSON filename>.json'


### IMPORT DASHBOARDS ###
## Setup Import Request

headers= {'AUTHORIZATION':'apikey ' + arcadia_apikey_secret}
payload = {'dry_run': False, 'dataconnection_name': arcadia_dataconnection_name}
# Open the exported JSON file in binary mode so requests can upload it, and close it when the request completes
with open(json_file_path, 'rb') as f:
	files = {'import_file': f}
	r = requests.post(arcadia_url + '/arc/migration/api/import/', files=files, data=payload, headers=headers)

## Check Response

if int(r.status_code) == 200:
	print("[+] INFO - IMPORT SUCCESS")
	print("[+] INFO - Response Status: %s" % r.status_code)
else:
	print("[!] ERROR - IMPORT FAILURE")
	print("[!] ERROR - Response Status: %s" % r.status_code)
	print("--------------------------")
	print(r.text)

An example of automated Exporting and Importing

Below is a Customer Dashboard that we generated in our lower environment (i.e. Dev), which is now ready to be promoted to Production.

Before we prepare the migration scripts, we need to find the ID of the Dashboard we plan to export. This can be done by locating the ID at the end of the Dashboard’s URL:

[Screenshot: the Dashboard ID at the end of the Dashboard URL]
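For example, if the Dashboard URL ends in /12034, then 12034 is the ID to use in the export script. A minimal sketch of pulling it out programmatically (the URL path shown is illustrative, not a real Arcadia path):

# Hypothetical Dashboard URL; only the trailing number matters
dashboard_url = 'http://<my dev hostname>:38888/arc/apps/app/12034'
dashboard_id = int(dashboard_url.rstrip('/').rsplit('/', 1)[-1])
print(dashboard_id)  # 12034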

Now I can update my previous script to include the ID for our Dashboard to be exported:

import requests
import json
import time

current_unixtimestamp_utc = int(time.time())

### SETUP #####
## Provide Arcadia Configuration

arcadia_url = 'http://<my dev hostname>:38888'
arcadia_apikey_secret = '<my dev secret api key>'
dashboard_id_list = [12034]
# Append dashboard filename with current unix timestamp
dashboard_export_filename = 'customerdashboard' + '_' + str(current_unixtimestamp_utc) + '.json'

### EXPORT DASHBOARDS ###
## Setup Export Request

headers= {'AUTHORIZATION':'apikey ' + arcadia_apikey_secret}
payload = {'dashboards': json.dumps(dashboard_id_list), 'filename': dashboard_export_filename, 'dry_run': 'False'}
r = requests.get(arcadia_url + '/arc/migration/api/export/', params=payload, headers=headers)

## Check Response

if int(r.status_code) == 200:
	print("[+] INFO - EXPORT SUCCESS")
	print("[+] INFO - Response Status: %s" % r.status_code)
	# Write JSON file for export on request success
	with open(dashboard_export_filename, 'w') as f:
		f.write(r.text)
else:
	print("[!] ERROR - EXPORT FAILURE")
	print("[!] ERROR - Response Status: %s" % r.status_code)
	print("--------------------------")
	print(r.text)

When the script executes I should see something like this:

$ python3 arcadia_export_api_example.py 
[+] INFO - EXPORT SUCCESS
[+] INFO - Response Status: 200

And I should now have a JSON file containing the exported Dashboard (customerdashboard_1551130456.json), as well as its Visuals, the Custom Color Palette, and the Custom Style that was attached to our top left Visual:

[Screenshot: the exported JSON file customerdashboard_1551130456.json]

Now in my target environment (i.e. Production), my test user (taddtest3) has access to a Connection (Production Arcengine), but currently there aren’t any Dashboards that have been migrated:

To migrate our Dashboard (and the attached Visuals, Dataset, Color Palette, and Custom Style), I need to update our Import Script to point at the exported JSON file, and also at the name of our Connection in the Arcadia Production environment (Production Arcengine):

import requests

### SETUP #####
## Provide Arcadia Configuration

arcadia_url = 'http://<my production hostname>:38888'
arcadia_apikey_secret = '<my production api key secret>'
arcadia_dataconnection_name = 'Production Arcengine'
json_file_path = '/Users/myuser/Downloads/customerdashboard_1551130456.json'


### IMPORT DASHBOARDS ###
## Setup Import Request

headers= {'AUTHORIZATION':'apikey ' + arcadia_apikey_secret}
payload = {'dry_run': False, 'dataconnection_name': arcadia_dataconnection_name}
# Open the exported JSON file in binary mode so requests can upload it, and close it when the request completes
with open(json_file_path, 'rb') as f:
	files = {'import_file': f}
	r = requests.post(arcadia_url + '/arc/migration/api/import/', files=files, data=payload, headers=headers)

## Check Response

if int(r.status_code) == 200:
	print("[+] INFO - IMPORT SUCCESS")
	print("[+] INFO - Response Status: %s" % r.status_code)
else:
	print("[!] ERROR - IMPORT FAILURE")
	print("[!] ERROR - Response Status: %s" % r.status_code)
	print("--------------------------")
	print(r.text)

When this script executes I should see something like this:

$ python3 arcadia_import_api_example.py 
[+] INFO - IMPORT SUCCESS
[+] INFO - Response Status: 200

Now if I check our Production Arcadia environment, I can see the imported Dataset attached to the Dashboard, as well as the Visuals, Color Palette, and Custom Style that came along with the import.

Exporting and Importing an existing Dashboard into Production with changes made in a lower environment (i.e. Dev)

Let’s say in our previous customer Dashboard we wanted to add a few filters to make the Dashboard more dynamic:

[Screenshot: the Customer Dashboard with new filters added]

With these changes made, we will need to re-run our export and import scripts to ensure that our Production environment has the latest copy of our Customer Dashboard. When we re-run our export script, we now have a new JSON file with a later timestamp: customerdashboard_1551133456.json

We then use this new JSON filename within our import script, and re-run it to migrate the Customer Dashboard once again into Production. You’ll notice after the script has completed that the existing Dataset, Dashboard, Visuals, Color Palette, and Custom Style in Production have not been duplicated.

And as expected, the newest version of your Dashboard is now available in your Production Environment:

Arcadia uses a unique signature to synchronize objects between environments to avoid duplications during the migration process. See this post for more information on the export and import process between Arcadia environments.
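If you promote Dashboards often, the two scripts can be collapsed into one small promotion step that exports from Dev and immediately imports into Production. Below is a minimal sketch, assuming the same endpoints, API Keys, Dashboard ID, and Connection name used above (error handling kept to a minimum):

import json
import time
import requests

# Dev (source) and Production (target) settings from the scripts above
dev_url = 'http://<my dev hostname>:38888'
dev_apikey = '<my dev secret api key>'
prod_url = 'http://<my production hostname>:38888'
prod_apikey = '<my production api key secret>'
dashboard_ids = [12034]
connection_name = 'Production Arcengine'

filename = 'customerdashboard_' + str(int(time.time())) + '.json'

# Export from Dev
export_payload = {'dashboards': json.dumps(dashboard_ids), 'filename': filename, 'dry_run': 'False'}
r = requests.get(dev_url + '/arc/migration/api/export/', params=export_payload,
                 headers={'AUTHORIZATION': 'apikey ' + dev_apikey})
r.raise_for_status()
with open(filename, 'w') as f:
	f.write(r.text)

# Import into Production
with open(filename, 'rb') as f:
	r = requests.post(prod_url + '/arc/migration/api/import/',
	                  files={'import_file': f},
	                  data={'dry_run': False, 'dataconnection_name': connection_name},
	                  headers={'AUTHORIZATION': 'apikey ' + prod_apikey})
r.raise_for_status()
print('[+] INFO - PROMOTION COMPLETE: ' + filename)

Because Arcadia matches objects by their unique signature, re-running this promotion step updates the existing objects in Production rather than duplicating them.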


Below are some additional arguments that can be passed along in the Export and Import API endpoint payloads:

Export API Arguments

'dry_run': Performs a test run of the migration process, confirming the actions that will occur when the Dashboard and other objects are exported. When you set 'dry_run'='False', the actual export is performed. (default value = 'True')
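In the export script above, this just means leaving 'dry_run' at its default or setting it to 'True' explicitly. A minimal sketch of a test-run request, reusing the variables from that script and assuming the dry-run response body is JSON like the sample shown below:

payload = {'dashboards': json.dumps(dashboard_id_list),
           'filename': dashboard_export_filename,
           'dry_run': 'True'}  # test run only; nothing is exported until this is 'False'
r = requests.get(arcadia_url + '/arc/migration/api/export/', params=payload, headers=headers)
print(r.json())  # lists the objects that would be exported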

Here’s a comparison of what you would see in the UI when confirming the import with the raw JSON response content from 'dry_run'='True':

{'appgroupmembership': [],
 'appgroups': [],
 'colorpalette': [{'id': 25, 'name': 'Custom Color Palette Example'}],
 'customcss': [{'id': 210, 'name': 'simple_js_custom_style_example'}],
 'dashboards': [{'id': 12034, 'name': 'Customer Sales Dashboard'}],
 'datasets': [{'id': 981, 'name': 'Superstore Sales (Tadd)'}],
 'dateranges': [],
 'events': [],
 'reportimage': [{'id': 7581, 'name': 'reportimage'},
                 {'id': 7580, 'name': 'reportimage'},
                 {'id': 7582, 'name': 'reportimage'},
                 {'id': 7583, 'name': 'reportimage'}],
 'segments': [{'id': 1118, 'name': 'Join Filter'},
              {'id': 1119, 'name': 'Region'}],
 'staticasset': [],
 'visuals': [{'id': 12033, 'name': 'Top 50 Customers', 'type': 'trellis-bars'},
             {'id': 12035,
              'name': 'Product Purchases:\xa0 <<customer_name:>>',
              'type': 'treemap'},
             {'id': 12036,
              'name': 'Purchase Timeline:\xa0 <<customer_name:>>',
              'type': 'calendar-heatmap'},
             {'id': 12045, 'name': '', 'type': 'picklist'},
             {'id': 12046, 'name': '', 'type': 'picklist'}]}

Import API Arguments

'sanity_check': Checks whether the table(s) exist in the target environment (tables from the Dataset are expected to have the same names between environments). If a table does not exist, the import will fail. (default value = 'True')

'dry_run': Performs a test run of the migration process, confirming the actions that will occur when the Dashboard and other objects are imported. When you set 'dry_run'='False', the actual import is performed. (default value = 'True')

'workspace': The workspace you will import your Dashboard into. (default value = 'Private')

'limit_search_workspace': When this option is set to ‘True’, it ignores other copies of your Dashboard that reside in other workspaces, and creates a new copy of the Dashboard in your workspace of choice. (default value = 'False')

'skip_thumbnails': If set to ‘True’, the import skips importing the thumbnails generated for your Dashboard. The default behavior is to import thumbnail image data for your Dashboard from the JSON file. (default value = 'False')

'generate_thumbnails': If set to ‘True’, the import process will generate brand new thumbnails for your Dashboard. The default behavior is to import thumbnail image data for your Dashboard from the JSON file. (default value = 'False')
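Putting several of these together, the payload in the import script above might look like the following sketch (the workspace name 'Public' is only an illustration; the other variables are the ones already defined in that script):

payload = {
    'dataconnection_name': arcadia_dataconnection_name,
    'dry_run': 'True',                 # confirm what the import would do before running it for real
    'sanity_check': 'True',            # fail if the Dataset's tables are missing in the target
    'workspace': 'Public',             # illustrative workspace name; default is 'Private'
    'limit_search_workspace': 'True',  # ignore copies of the Dashboard in other workspaces
    'skip_thumbnails': 'True',         # don't import the thumbnail data from the JSON file
}
r = requests.post(arcadia_url + '/arc/migration/api/import/', files=files, data=payload, headers=headers)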

Below are screenshots of how you can tie these arguments to various functionality within the Arcadia UI:

[Screenshots: import options in the Arcadia UI that correspond to these arguments]

And here’s also a comparison of the JSON response from 'dry_run'='True' with what you would see in the UI when confirming the import:

[Screenshot: the import confirmation dialog in the Arcadia UI]

{'appgroupmembership': [],
 'appgroups': [],
 'colorpalette': [{'action': {'create': False,
                              'data_update': False,
                              'thumbnail_update': False,
                              'workspace_update': False},
                   'id': 26,
                   'name': 'Custom Color Palette Example'}],
 'customcss': [{'action': {'create': False,
                           'data_update': False,
                           'thumbnail_update': False,
                           'workspace_update': False},
                'id': 334,
                'name': 'simple_js_custom_style_example'}],
 'dashboards': [{'action': {'create': True,
                            'data_update': True,
                            'thumbnail_update': False,
                            'workspace_update': True},
                 'id': None,
                 'name': 'Customer Sales Dashboard'}],
 'datasets': [{'action': {'create': True,
                          'data_update': True,
                          'thumbnail_update': False,
                          'workspace_update': False},
               'id': None,
               'name': 'Superstore Sales (Tadd)'}],
 'dateranges': [],
 'events': [],
 'reportimage': [{'action': {'create': True,
                             'data_update': True,
                             'thumbnail_update': False,
                             'workspace_update': False},
                  'id': None,
                  'name': 'reportimage'},
                 {'action': {'create': True,
                             'data_update': True,
                             'thumbnail_update': False,
                             'workspace_update': False},
                  'id': None,
                  'name': 'reportimage'},
                 {'action': {'create': True,
                             'data_update': True,
                             'thumbnail_update': False,
                             'workspace_update': False},
                  'id': None,
                  'name': 'reportimage'},
                 {'action': {'create': True,
                             'data_update': True,
                             'thumbnail_update': False,
                             'workspace_update': False},
                  'id': None,
                  'name': 'reportimage'}],
 'segments': [{'action': {'create': True,
                          'data_update': True,
                          'thumbnail_update': False,
                          'workspace_update': False},
               'id': None,
               'name': 'Join Filter'},
              {'action': {'create': True,
                          'data_update': True,
                          'thumbnail_update': False,
                          'workspace_update': False},
               'id': None,
               'name': 'Region'}],
 'staticasset': [],
 'visuals': [{'action': {'create': True,
                         'data_update': True,
                         'thumbnail_update': False,
                         'workspace_update': True},
              'id': None,
              'name': 'Top 50 Customers',
              'type': 'trellis-bars'},
             {'action': {'create': True,
                         'data_update': True,
                         'thumbnail_update': False,
                         'workspace_update': True},
              'id': None,
              'name': 'Product Purchases:\xa0 <<customer_name:>>',
              'type': 'treemap'},
             {'action': {'create': True,
                         'data_update': True,
                         'thumbnail_update': False,
                         'workspace_update': True},
              'id': None,
              'name': 'Purchase Timeline:\xa0 <<customer_name:>>',
              'type': 'calendar-heatmap'},
             {'action': {'create': True,
                         'data_update': True,
                         'thumbnail_update': False,
                         'workspace_update': True},
              'id': None,
              'name': '',
              'type': 'picklist'},
             {'action': {'create': True,
                         'data_update': True,
                         'thumbnail_update': False,
                         'workspace_update': True},
              'id': None,
              'name': '',
              'type': 'picklist'}]}